
    Orthogonal projections are optimal algorithms

    Some results on worst case optimal algorithms and recent results of J. Traub, G. Wasilkowski, and H. Woźniakowski on average case optimal algorithms are unified. By the use of Householder transformations it is shown that orthogonal projections onto the range of the adjoint of the information operator are, in a very general sense, optimal algorithms. This allows a unified presentation of average case optimal algorithms relative to Gaussian measures on infinite dimensional Hilbert spaces. The choice of optimal information is also discussed.
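
    In finite dimensions the statement can be checked directly: the minimal-norm (spline) algorithm based on information y = Nf reproduces the orthogonal projection of f onto the range of the adjoint N*. A minimal numpy sketch, assuming a full-row-rank matrix N stands in for the information operator (the setup is illustrative, not from the paper); np.linalg.qr computes the factorization via Householder reflections:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 3, 8
    N = rng.standard_normal((n, d))    # information operator, full row rank
    f = rng.standard_normal(d)         # unknown element
    y = N @ f                          # observed information

    # QR factorization of the adjoint N^T (Householder-based) gives an
    # orthonormal basis Q for range(N^T).
    Q, _ = np.linalg.qr(N.T)

    # Minimal-norm (spline) algorithm: sigma(y) = N^T (N N^T)^{-1} y.
    sigma = N.T @ np.linalg.solve(N @ N.T, y)

    # It coincides with the orthogonal projection of f onto range(N^T).
    proj_f = Q @ (Q.T @ f)
    print(np.allclose(sigma, proj_f))  # True
    ```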

    Efficient First Order Methods for Linear Composite Regularizers

    A wide class of regularization problems in machine learning and statistics employs a regularization term obtained by composing a simple convex function ω with a linear transformation. This setting includes Group Lasso methods, the Fused Lasso and other total variation methods, multi-task learning methods and many more. In this paper, we present a general approach for computing the proximity operator of this class of regularizers, under the assumption that the proximity operator of the function ω is known in advance. Our approach builds on a recent line of research on optimal first order optimization methods and uses fixed point iterations for numerically computing the proximity operator. It is more general than current approaches and, as we show with numerical simulations, computationally more efficient than available first order methods which do not achieve the optimal rate. In particular, our method outperforms state-of-the-art O(1/T) methods for overlapping Group Lasso and matches optimal O(1/T^2) methods for the Fused Lasso and tree structured Group Lasso.
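
    A minimal numpy sketch of the fixed point idea, not the paper's exact scheme: for ω = λ||·||_1 the conjugate is the indicator of the ℓ∞ ball of radius λ, so the proximity operator of λ||Bu||_1 can be obtained by a projected-gradient fixed point iteration on a dual variable (the function name, step size, and data below are illustrative):

    ```python
    import numpy as np

    def prox_l1_compose(x, B, lam, n_iter=500):
        """Prox of f(u) = lam * ||B u||_1 at x via fixed point iterations
        on a dual variable v with ||v||_inf <= lam.
        Uses prox_f(x) = x - B^T v*, where v* is the fixed point."""
        L = np.linalg.norm(B @ B.T, 2)   # Lipschitz constant of the dual gradient
        v = np.zeros(B.shape[0])
        for _ in range(n_iter):
            v = np.clip(v + (B @ (x - B.T @ v)) / L, -lam, lam)
        return x - B.T @ v

    # Example: fused-lasso / total-variation prox, where B is the discrete
    # difference operator, so (B u)_i = u_{i+1} - u_i.
    d = 10
    B = np.diff(np.eye(d), axis=0)
    x = np.array([0., 0.1, 0.05, 3., 3.1, 2.9, 3.05, 0.2, 0., 0.1])
    print(prox_l1_compose(x, B, lam=0.5).round(3))  # nearly piecewise constant
    ```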

    Biorthogonal multivariate filter banks from centrally symmetric matrices

    We provide a practical characterization of block centrally symmetric and anti-symmetric matrices which arise in the construction of multivariate filter banks, and we use these matrices to construct biorthogonal filter banks with linear phase.
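
    For illustration, a matrix A is centrally symmetric when JAJ = A and centrally anti-symmetric when JAJ = -A, where J is the exchange (flip) matrix; reversing both axes of A applies J on both sides. A minimal numpy sketch (the helper name and test matrix are illustrative):

    ```python
    import numpy as np

    def is_centrally_symmetric(A, anti=False):
        """True if J A J == A (centrally symmetric) or, with anti=True,
        J A J == -A (centrally anti-symmetric)."""
        flipped = A[::-1, ::-1]          # reverse rows and columns: J A J
        return np.allclose(flipped, -A if anti else A)

    # Linear-phase filters have (anti-)symmetric coefficient sequences,
    # which is the matrix-level symmetry used in such constructions.
    A = np.array([[1., 2., 3.],
                  [4., 5., 4.],
                  [3., 2., 1.]])
    print(is_centrally_symmetric(A))     # True
    ```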

    Universal Kernels

    In this paper we investigate conditions on the features of a continuous kernel under which it can approximate an arbitrary continuous target function uniformly on any compact subset of the input space. A number of concrete examples of kernels with this universal approximating property are given.
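
    The Gaussian kernel is a classical example of a universal kernel. A minimal numpy sketch, approximating a continuous target uniformly on the compact set [0, 1] by a finite kernel expansion (the target, bandwidth gamma, and regularizer eps are illustrative choices, not from the paper):

    ```python
    import numpy as np

    def gaussian_kernel(s, t, gamma=50.0):
        # K(s, t) = exp(-gamma * (s - t)^2), a universal kernel on R
        return np.exp(-gamma * (s[:, None] - t[None, :]) ** 2)

    target = lambda t: np.sin(2 * np.pi * t) + np.abs(t - 0.5)  # continuous target

    centers = np.linspace(0.0, 1.0, 40)         # expansion points
    eps = 1e-8                                  # tiny ridge for numerical stability
    c = np.linalg.solve(gaussian_kernel(centers, centers) + eps * np.eye(40),
                        target(centers))        # expansion coefficients

    grid = np.linspace(0.0, 1.0, 1000)          # dense grid on the compact set
    approx = gaussian_kernel(grid, centers) @ c
    print(np.abs(approx - target(grid)).max())  # small uniform error
    ```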

    On Sparsity Inducing Regularization Methods for Machine Learning

    In recent years there has been an explosion of interest in learning methods based on sparsity regularization. In this paper, we discuss a general class of such methods, in which the regularizer can be expressed as the composition of a convex function ω with a linear function. This setting includes several methods such as the group Lasso, the Fused Lasso, multi-task learning and many more. We present a general approach for solving regularization problems of this kind, under the assumption that the proximity operator of the function ω is available. Furthermore, we comment on the application of this approach to support vector machines, a technique pioneered by the groundbreaking work of Vladimir Vapnik.
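
    A minimal sketch of this recipe under the stated assumption that the proximity operator of ω is available, here proximal gradient (ISTA) with a non-overlapping group Lasso regularizer (the helper names, data, and λ are illustrative):

    ```python
    import numpy as np

    def prox_group_lasso(u, groups, t):
        """Prox of t * sum_g ||u_g||_2 for non-overlapping groups:
        block soft-thresholding."""
        out = u.copy()
        for g in groups:
            norm = np.linalg.norm(u[g])
            out[g] = 0.0 if norm <= t else (1.0 - t / norm) * u[g]
        return out

    def ista(X, y, groups, lam, n_iter=300):
        """Proximal gradient for (1/2)||Xw - y||^2 + lam * omega(w),
        assuming only that the prox of omega is available."""
        L = np.linalg.norm(X.T @ X, 2)   # Lipschitz constant of the gradient
        w = np.zeros(X.shape[1])
        for _ in range(n_iter):
            grad = X.T @ (X @ w - y)
            w = prox_group_lasso(w - grad / L, groups, lam / L)
        return w

    rng = np.random.default_rng(0)
    X = rng.standard_normal((30, 6))
    y = X[:, :3] @ np.array([1.0, -2.0, 0.5])    # only the first group matters
    groups = [slice(0, 3), slice(3, 6)]
    print(ista(X, y, groups, lam=5.0).round(3))  # group-sparse estimate
    ```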